#1# jan 28
MECHANICS
Books vs. students
take home exams
machine problems in about a month
additional papers
errata
bibliographical material--some kind of library (beware of rip-off)
general h.s.
much later: missing sections
INTRODUCTION
Why study LISP? The question is more fundamental: why a high-level language
for data structures?
Why high-level for anything?
Why is Fortran-pl1-algol of value for numerical problems? They allow us to
formulate our problems at a certain level of abstraction-- to talk about
our problems in mathematical terms, discussing numbers and functions,
rather than bit patterns in the machine. (Ignore approximations in
computations, in algorithm or rep of numbers.) Thus the primary benefit of
high-level language is notational. At a certain level, after the
conceptualization is done, we must usually worry about efficiency.
Efficiency studies can be the difference between a practical solution and a
disaster. But the basic understanding must come first. That is the level at
which we shall discuss our algorithms for most of the course. We are
asking for a level of abstraction similar to that which common mathematical
notation gives us for numerical problems. What are the benefits? We hope
to demonstrate in the course that this added abstractness allows us to see
more clearly exactly what the basic structures of data structure
manipulation are. We will also show that the great majority of what is
modern c.s. IS such algorithms. What will be impressive is that at this
level of abstraction, we can more clearly cover more material with a better
understanding. (DEK vs. MIX)
math notation        →    ??
    ↓ ↑                   ↓ ↑
efficiency of        →    efficiency of
numerical alg.            data struct alg.
at this level of abstraction we are interested in values of expressions
rather than the means by which they are computed; cf x=sin(y*3). why is
this good? difficult and pressing problems of c.s. -- correctness and/or
equivalence of progs -- are addressed most easily at this level. more and
more frequently we are becoming able to write program manipulation systems
which transform progs equivalently...
at this level we can develop the necessary tools to aid in construction of
correct programs. the problems of c.s. are not efficiency, but correctness.
general disclaimer: we will not pursue theory for the sheer joy of
inflicting pain; neither shall we ignore its benefits when it pokes us in
the eye. Similarly for efficiency; we certainly can talk about relative
efficiency at this higher level, without resorting to counting cycles.
This answers, or begins to answer: WHY HIGH LEVEL? WHY LISP?
why LISP
mechanics
prog lang ideas, w.o. hairy syntax
recursion, functional args and values.
abstract data structures -- data structures as compared with storage structures
structured programming --whatever that means
semantics of prog languages -- interpreter to understand meaning of constructs
symbol tables and binding strategies.
**by this time you should be hooked ***
implementation -- realization of evaluator on "real" machine
lists, garbage collectors, storage reclamation..
parsers, scanners, symbol tables
running and compilation -- easiest and best way to see compilers
assemblers and data structures
debuggers, editors, and on-line systems
efficiency -- guts of alternative implementations
pointer manipulations, threading, double linking
string processors
theory -- denotational semantics, proving correctness
mathematical methods and their importance
applications -- a.i., t.p., interactive programming
pattern matching, PLANNER and CONNIVER
game playing, program manipulators
implications for language design -- how can we improve
political, philosophical, bull shit -- one man's phil. is another's b.s.
what does all this have to do with course on data structures?
show relationship ; go back and underline instances of d.s.
taken in context the ideas are quite easy; out of context they are
rather sterile.
MECHANICS OF LISP
attempt to blend theory with practice. we will not pursue theory just for
the sheer perversity, neither shall we ignore rigor when it will improve
our perception.
BASICS
formal language
domain:symbolic expressions
operations
domain atomic
BNF almost NO syntax here; very D-U-L-L
non-atomic:dotted pairs
atoms
literal atoms=uppercase identifiers
numbers (uninteresting)
non-atomic: dotted pairs
examples.
interpretation as binary trees; easy.
functions: domain, range and graph(set of ordered pairs)
primitive:
cons[x;y] total; forms dotted pair; called constructor
car,cdr partial; first or second; called selectors.
composition allowed; abbreviation of car-cdr chains
informal definition facility: <=
note: good LISP functions will seldom use explicit car's and cdr's.
we will begin discussing esthetics next time.
not very exciting so far. need conditional
[p1→e1; ....] semantics
obviously need predicates;
LISP maps true to T and false to NIL thus predicates have values in domain
not truth values as normally understood in math; thus can compose
really should call LISP "predicates" something else: propositional forms
o.w. condition
two primitive: atom and eq
atom total
eq only for atoms
example:fact
equal
subst
fact with aux function
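sketches of these in the <= notation (the standard definitions; fact* is
the aux-function version, a hint of things to come):
fact[n] <= [n=0 → 1; T → n*fact[n-1]]
equal[x;y] <= [atom[x] → [atom[y] → eq[x;y]; T → NIL];
               atom[y] → NIL;
               equal[car[x];car[y]] → equal[cdr[x];cdr[y]];
               T → NIL]
subst[x;y;z] <= [atom[z] → [eq[z;y] → x; T → z];
                 T → cons[subst[x;y;car[z]];subst[x;y;cdr[z]]]]
fact[n] <= fact*[n;1]
fact*[n;m] <= [n=0 → m; T → fact*[n-1;n*m]]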
compare computation and mathematics
order of evaluation, undefined and non-terminating computations
examples:
car[cons[x;y]] = x ?; no since y may not terminate or may be undefined
[0=0 → 1; T → 1/0]
f[x;y] <= y if we insist on evaluating x we may lose "unnecessarily"
eg f[car[A];...]
#2# jan 30
Review and examples
*****has everyone seen differentiation of polynomials?*****
king kong
take home exams
machine problems in about a month
additional papers
errata
bibliographical material--some kind of library (beware of rip-off)
general h.s.
much later: missing sections
office hours 216 1-3 t-th
REVIEW
Why high level: high level is mostly notational convenience
abstraction
clarity: close to our informal formulation
=>easier to debug
=>less dependent on specific implementation ****very important!!!!!
=> easier to write more complex progs.
=> the reasoning we did to convince ourselves of the correctness of the
informal algorithm will usually carry over. (except approximation)
interested in WHAT is computed rather than HOW?
problems in current c.s. are correctness not efficiency.
why LISP -- see as we go along
if you know Weissman----forget it for now
motivation: efficiency after conceptualization
where we will see "real" data structures.
basics of language a domain - sexprs ; and functions
sexprs binary trees and dotted pairs ---both are representations
binary trees, NO intersections
perhaps BNF for atoms and dotted pairs.
cf. 2 in roman, binary, etc...
examples
functions basic characteristics of d.s. functions: constructors selectors
and recognizers
constructor: cons
selectors: car,cdr tell why
recognizer: atom
predicate: eq (=)
composition of functions
<= intuitive definition
do simple evaluation car[cdr[(A . (B . NIL))]]
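worked, step by step:
   cdr[(A . (B . NIL))] = (B . NIL)     cdr selects the second component
   car[(B . NIL)] = B                   car selects the first
so the value is B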
problems of evaluation, computation and mathematics
car[cons[x;y]] = x
f[x;y] <= y
things can get worse with conditionals
review semantics
do f[x;y] <= [g[x] → 1; T→ 1]
do example problems
fact
*** terminology:
recursive function
general case and termination: c.f. induction
use diff as example
***see any goto's?***
equal
subst
#3# feb 4
LIST NOTATION AND ABSTRACTION
Today we will start putting things together. so far we have just seen
a rather stark outline of a programming language. What does this have
to do with data structures? What ARE data structures?
What we will do is attempt to characterize a common mathematical
structure, (sequences) as a data structure. This investigation will
give us a handle on answering questions like:
1. what is a d.s.
2. how do you characterize one
3. the power of abstraction
4. top down programming, and structured programming --whatever that means.
Note first that this is not simply a sterile exercise in mathematical
chicanery. Sequences are frequently used in numerical problems as arrays
and will appear in the majority of our non-numerical work.
For example
frequently in c.s. we find it convenient to deal with sequences.
examples
(x+y)+(x+z) => 2x+y+z
add[(+,x,y);(+,x,z)] => (+,(+,2x,y),z)
Recall that we have made a beginning into the discussion of sexprs, by noting
the characterization in terms of constructors, selectors, and recognizers.
constructor: list
selectors: first, second, nth
recognizer: null
perhaps islist as additional recognizer: typing
predicate: =
constant: ()
now let's start using these characterizations and see if they are
sufficient and/or most natural.
example length
append
fact1
reverse
do length1
length[x] <= [null[x] → 0; T → plus[1;length[????]]]      ????: rest[x]
note that we can define second...nth in terms of first and rest.
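e.g. (nth here is a plausible sketch; counting from 1 is an assumption):
second[x] <= first[rest[x]]
third[x] <= first[rest[rest[x]]]
nth[x;n] <= [n=1 → first[x]; T → nth[rest[x];n-1]]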
try a list-constructing function, append
append[x;y] <= [null[x] → y; T → ???]      ???: use concat
note that we can define (almost) list in terms of concat.
append[x;y] <= [null[x]→y; T →concat[first[x];append[rest[x];y]] ]
question of representation: how can we reconcile our
neat usage of list-manipulating functions and predicates
with sexprs?? problem of representation
1. rep for lists as sexprs
do BOX NOTATION as embellishment of binary trees
notation for NIL
2. implied rep of functions and predicates as sexpr manipulators
3. i/o conventions
#4# feb 6
Cleanup and review
pre-note: things must be intuitive: theory in cs. should be
used like garlic in cooking.
as things currently stand, you should have a reasonably complete
picture of the world of data structures. In a phrase, almost devoid
of any meaning, it's the problem of representation.
Just as mathematics maps real-world problems onto numerical theories
computer science maps problems onto "theories" over data structures.
homely parallel: machine language in d.s. and roman numerals in mathematics
For example we will soon show that the problems of evaluation, compilers,
language implementation are all amenable to this technique. Indeed the
question of representation is quite relative; we will show how
the primitives of LISP (in particular) are easily mapped onto lower-level
representations; i.e. the storage structures.
Once we understand the higher-level representation problems, the
questions of storage and relative efficiency are quite approachable.
we made a start at discovering data structures; note that what is
a data structure at one level is program at another. thus the
distinction between prog and data is not as transparent as one would like.
note too that constructors, selectors and recognizers, even with
the I/O conventions, are not the most satisfactory solution. Frequently
we want more control. I.e. there will be cases in which we will want
operations only applicable to specific d.s. rather than applicable in general.
look at attempt to characterize "finite set". look at = for set and seq and bag.
I.e. we want to be able to program at the higher-level of list and sequence
user-defined data structures.
what are the components of programming languages: data structures;
operations(procedures); and control structures( if-then; while;do; go to;
..., and recursion)
What's different about recursion? It's implicit rather than explicit.
or it's semantic rather than syntactic.(?)
What can we say about this high-level data structure approach?
this is what top down programming- abstract data structures, structured
programming is all about.. it has very little to do with goto's. it's
programming style! structured programING (sic).
also note that if we change representations, then all we need change is
the const, sel, and recog.
homework, problems, and graduate students.
review and do examples:
graphing dotted pair <=> binary tree
list <=> binary tree
functions and evaluation
evaluation should be reasonably intuitive; an extension of
things like "evalute x+y/z when x=2 y=3 z=4" or
"evalute f[2;3;4] when f[x;y;z]=x+y/z "
only now things are a bit more complicated. The complication
is most apparent during recursive evalaution. It can be assuaged
by making explicit substitution, rather than trying to remember current
values.
The real sticky business finally comes when we have to write our
own functions. But who says programming is easy?
LISP at least makes your decisions easy: there is ONLY one way to
write a non-trivial function: recursion! The arguments go like
induction: find the right induction hypothesis and win; find the right
recursion computation and win. It's easier to begin with unary
functions; you don't have to decide what to recur on.
Then you must think of the structure on the argument
sexpr: atom or dotted pair
list: null or non-empty
examples: we will stay away from "meaty" examples today; next time start
doing applications to "real" problems. First must get the mechanisms straight.
member
times over plus
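plausible sketches for both (member's use of eq assumes atomic elements):
member[x;l] <= [null[l] → NIL;
                eq[x;first[l]] → T;
                T → member[x;rest[l]]]
times[m;n] <= [m=0 → 0; T → plus[n;times[m-1;n]]]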
#5# feb 11
More examples:
gcd?
do some involving construction:
pairlis is over sexprs and lists
assoc
try reverse
reverse[x] <= [null[x] → (); T → append[reverse[rest[x]];list[first[x]]]]
if you look at a typical computation sequence involved in an evaluation
you will be appalled at the inefficiency. The intuitive computation is
quite simple:
x y
(A B C D) ()
(B C D) (A)
(C D) (B A)
(D) (C B A)
() (D C B A)
Certainly, with the typical language's assignment statement it would
be obvious to do. The subset of LISP which we currently have doesn't have
assignment. There is one obvious answer, but it is valuable to examine
carefully the power of the tools which we currently have.
look at factorial again.
n! = n*[(n-1)*[(n-2)*[ ....]...]]
we need a separate variable to hold the on-going computation
fac[n] <= fac*[n;1]
fac*[n;m] <= [n=0 → m; T → fac*[n-1; m*n]]
try separate trick for reverse since the computational effect is similar.
rev[x] <= rev*[ x;()]
rev*[x;y] <= [null[x] → y; T → rev*[rest[x];cons[first[x];y]]]
compare rev and fac
f[x] <= f*[ x;G[] ]
f*[x;y] <= [p[x] → g[y]; T → f*[s1[x];c1[s2[x];h[y]]]]
relation between append and reverse.
??box notation???
********************************************
nice meaty examples
do example of differentiation.
importance
shows informal alg which is naturally recursive
note if a process even hints of being mechanical
it can be programmed
show mapping of domain onto rep as lists.
shows mapping of intuition to prog (spec) language
show lisp power
will show how to take into abstract algorithm
do selectors: op, arg1, arg2
recognizers: isprod, issum, isconst, isvar
constructors: makesum, makeprod
do example: x*y +x wrt x
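a sketch of the abstract algorithm in terms of these (product rule
taken as u*dv + du*v):
diff[e;x] <= [isconst[e] → 0;
              isvar[e] → [eq[e;x] → 1; T → 0];
              issum[e] → makesum[diff[arg1[e];x];diff[arg2[e];x]];
              isprod[e] → makesum[makeprod[arg1[e];diff[arg2[e];x]];
                                  makeprod[diff[arg1[e];x];arg2[e]]]]
so diff[x*y + x; x] => (x*0 + 1*y) + 1, unsimplified.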
#6# feb 13
relationship between abstraction and structured programming, stepwise-
refinement
show pic of trees and fns in parallel.
why numerical examples are not good: essence of problem has been
squeezed into numerical relationships, all structuring is gone; it's
implicit rather than explicit.
thinking this way is inefficient?: bull shit! learn to make machines
*************************************
************** DO tgmoaf ************
*************************************
start value
#7# feb 18
value function for polys.
constants, sums and prods
x+y where x=2, y=3: tables
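sketched on the list representation (the definitions come next lecture):
value[(PLUS X Y); ((X . 2) (Y . 3))]
 = +[lookup[X;tbl]; lookup[Y;tbl]]
 = +[2;3]
 = 5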
#8# feb 20
more value
f[2;3] where f[x;y] <= x+y
value[e;tbl] <=
[isvar[e] → lookup[e;tbl];
isconst[e] → e;
isfun_args[e] → apply[fun[e];eval_args[args[e];tbl];tbl];
]
lookup[x;tbl] <=
[var[first[tbl]] = x → val[first[tbl]];
T → lookup[x;rest[tbl]]
]
eval_args[args;tbl] <=
[null[args] →();
T → concat[value[first[args];tbl]; eval_args[rest[args];tbl]]
]
apply[fn;eargs;tbl] <=
[is_plus[fn] → +[arg1[eargs];arg2[eargs]];
is_prod[fn] → *[arg1[eargs];arg2[eargs]];
... <other known functions> ...
T → value[body[lookup[fn;tbl]]; new_tbl[vars[lookup[fn;tbl]];eargs;tbl] ]
]
new_tbl[vars;vals;tbl] <=
[null[vars] → tbl;
T → concat[mkent[first[vars];first[vals]];
new_tbl[rest[vars];rest[vals];tbl]
]
]
For cambridge polish representation the following const., sel., recog. suffice:
recognizers:
isvar[x] <= [atom[x] → [numberp[x] → NIL; T → T]; T → NIL]
isconst[x] <= numberp[x]
is_plus[f] <= eq[f;PLUS]      (f is the function name apply receives)
is_prod[f] <= eq[f;TIMES]
isfun_args[e] <= not[atom[e]]      (not given above; a plausible completion)
selectors:
fun[x] <= first[x]
args[x] <= rest[x]
(assuming the table is represented as a list of dotted pairs:)
var[x] <= car[x]
val[x] <= cdr[x]
arg1[x] <= first[x]
arg2[x] <= second[x]
(assuming function definitions are stored as (F . ((X Y) (PLUS X Y))))
body[x] <= second[x]
vars[x] <= first[x]
mkent[x;y] <= cons[x;y]
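a trace of f[2;3] under these definitions, with
tbl = ((F . ((X Y) (PLUS X Y)))):
value[(F 2 3); tbl]
 = apply[F; eval_args[(2 3);tbl]; tbl]
 = apply[F; (2 3); tbl]
 = value[(PLUS X Y); new_tbl[(X Y);(2 3);tbl]]
 = value[(PLUS X Y); ((X . 2) (Y . 3) (F . ((X Y) (PLUS X Y))))]
 = apply[PLUS; (2 3); ...]
 = +[2;3] = 5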
comments
test march 4; one week
note cleanliness of definition
assumptions are more easily seen: call-by-value, order of evaluation..
machine independent
no car cdrs.
if you knew about a specific machine, you could very easily transform
value into a compiler
difference between compiler and evaluator
why compilers are a hack
why syntax is a hack
on to tgm's
LECT 2: evaluation
review of deriv and representation
intuitive sketch
constants
vars and environment
function calls: decision cbv and l-r
how do we find name
conditionals
example rev1[x,y] <= [null[x] → y; T → rev1[cdr[x];cons[car[x];y]]]
for rev1[(A B C);()]
note recursiveness of evaluation
cf. diff example
look at tgmoaf
do mapping of LISP expressions≡forms to sexprs
start eval
question: how to find vars and functions
symb. tabs and assoc for lookup; cons for entry
form-valued vars.
start apply
how isfun recognizes: (FN ...) but compare (COND ...)
special forms: add evcond
var is function; what is f[x;y]: compare cons[A;B] and cons [x;y]
use f[x;y] <= x+y
what about functions? the λ-calculus
λ-calc
terminology λ-vars,λ-list, body
syntax-pragmatics-semantics
<= looks like assignment? or does it?
DBA
finish basic eval-apply
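a minimal sketch of the pair, LISP 1.5 style (number constants and the
remaining primitives elided; lookup via assoc on the a-list; label
included for what follows):
eval[e;a] <= [atom[e] → cdr[assoc[e;a]];
              eq[car[e];QUOTE] → cadr[e];
              eq[car[e];COND] → evcond[cdr[e];a];
              T → apply[car[e];evlis[cdr[e];a];a]]
apply[fn;args;a] <=
 [atom[fn] → [eq[fn;CAR] → caar[args];
              eq[fn;CDR] → cdar[args];
              eq[fn;CONS] → cons[car[args];cadr[args]];
              eq[fn;ATOM] → atom[car[args]];
              eq[fn;EQ] → eq[car[args];cadr[args]];
              T → apply[eval[fn;a];args;a]];
  eq[car[fn];LAMBDA] → eval[caddr[fn];pairlis[cadr[fn];args;a]];
  eq[car[fn];LABEL] → apply[caddr[fn];args;cons[cons[cadr[fn];fn];a]]]
evcond[l;a] <= [eval[caar[l];a] → eval[cadar[l];a]; T → evcond[cdr[l];a]]
evlis[l;a] <= [null[l] → NIL; T → cons[eval[car[l];a];evlis[cdr[l];a]]]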
what about defns: label
label works as assignment in LISP, but ok.
problems of globals
disallow; lose very big.
take previous val;lose:fact
take value when applied;lose funarg
fact-foo DBA example again
x=1 vs. x←1
= as predicate and as equation x=x↑2 + 6
fixpoint
functions get messy: get notation: Weizenbaum B-W.
functional args: what does it mean? foo[x,y] <= x[y]
rep as sexpr
functional args passed in
functions as values
but label is good for lisp
adding functions
adding other things: special forms; and[x1, ..., xn]
point to implementations and p-lists
scott: parallel sit to λ-calc 'til 1969
for lisp
why: over-specify
precision
therefore mathematics available for proofs of properties
on (yecch) machine
define and evalquote
LECT 3:implementation --static
review eval with implementation in mind
eval constants and conditionals
apply primitives, lambdas and fn names
question why not handle cond inside apply
special forms: another example, "and"
two kinds of evaluation--extend to user
another problem: definitions go away eval[e, env]
two solutions start with BIG env'; or hand code them into eval and apply
bindings
globals: dynamic binding; look at structure of alist
can't outlaw; c.f. fact; different solution: fix-point
functional args
coming in
going out
implementation
funarg hack: lisp was first
(FUNARG fn st)
do weizen examples
implementation
separate out symbol table: eval[e]
different properties expr, fexpr
efficient
box notation NIL and atom header
properties: subr fsubr expr fexpr value
attribute-val pairs
atom header thus atom[x]
search: GET; add PUTPROP
unique atoms (via read) thus eq[x;y]
what about printing (A.B) thus PNAME
worry about binding after we get machine set up.
what about cons: free space and full word space
gc: base pointers: symbol table and partial comp
basic structure of lisp machine; h.s. about inst and data
read eval print loop
details of read-ratom, parser, scanner
hashing, oblist,buckets ***if time***
ratom-unique atom
do (CONS (QUOTE A)(QUOTE B))
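the trace, roughly (using the eval sketched earlier):
read:  (CONS (QUOTE A) (QUOTE B))      -- unique atoms via ratom
eval:  apply[CONS; evlis[((QUOTE A) (QUOTE B));NIL]; NIL]
       = apply[CONS; (A B); NIL] = cons[A;B]
print: (A . B)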
now finally, bindings
in apply
islambda => save old, bind new, eval body, restore old, buzz off
recall assoc
look at value cell...special push-down , mscw
advantages, fast save and restore
disadvantages, funargs
recall old
now at defn save old sp
at apply restore to state at old sp, while saving current
bind locals and go, at end, rebind flushing to state at entry.
sigh
bootstrapping
next time compilers, and machines
wed??? implications and scott
LECT 4: compilation and dynamics
**************************
*****tell about problem on p149.******
put evlis and evcon on the board
***************************
REVIEW
p-lists
values-cell and binding
funargs
structure of new eval.. notice now look first at fn before args
indicators expr fexpr value macro
do "and"
compilers, machines and implementation of recursion
what is it: relation to interpretation; nothing conceptual
why compile: speed + extra sexpr space..conceptually no better
bootstrapping
structure compiler: generators
machine to machine
lang to extension
binary program space (hexadecimal)
LISP compilers don't hack syntax
compilers in tgm-steps
function composition and constant args
refer to evlis
calling conventions: stack+ac1-acn
value returned convention
comp,push, comp, push ..... loadac[m]
do it; MUST since crucial to following var arg!!
integrity of stack
add conditionals
refer to evcond
linearize cond tree
start comcond[((p1 e1) ...(pn en))]
convention on truth,jumpe;gensym
write comcond; and add recognizer in compexp
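the code shape comcond emits, sketched with gensym'd labels (the
mnemonics are PDP-10-flavored assumptions: JUMPE jumps when AC1 holds
NIL, JRST is the unconditional jump):
      <code for p1>          ; value in AC1
      (JUMPE 1 L1)           ; false: try next clause
      <code for e1>
      (JRST OUT)
L1    <code for p2>
      (JUMPE 1 L2)
      ...
Ln-1  <code for en>
OUT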
one pass assemblers
fixups and forward refs
add local variables
consider λ[x,y,z] ....x.....y.....z
convention says @call v(x), v(y), v(z) in ac1,2,3; save 'em;
partial comp changes t.o.s.
who? complis.
good example j[x y] <= f[g[x];h[y]]
convention: ...-n P)
offset + rel pos
lookup is offset+cdr[assoc...]
conventions: λ[x y z] => prup, PUSH, PUSH, ...
clean-up stack and POPJ
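hypothetical code for the j example above (mnemonics assumed, not
gospel); note the SAME textual offset -1 P names x the first time and y
the second, because the PUSH of g[x] moved the t.o.s. -- exactly what
complis must track:
 (PUSH P 1)       ; save x
 (PUSH P 2)       ; save y         x at -1 P, y at 0 P
 (MOVE 1 -1 P)    ; v(x) to AC1
 (CALL G)         ; g[x] in AC1
 (PUSH P 1)       ; push g[x]      t.o.s. moved: y now at -1 P
 (MOVE 1 -1 P)    ; v(y) to AC1
 (CALL H)         ; h[y] in AC1
 (PUSH P 1)       ; push h[y]
 (MOVE 2 0 P)     ; loadac: h[y] to AC2
 (MOVE 1 -1 P)    ;         g[x] to AC1
 (SUB P 2)        ; flush the two temporaries
 (CALL F)         ; f[g[x];h[y]] in AC1
 (SUB P 2)        ; flush x and y
 (POPJ P)         ; return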
add globals shallow and deep
stack won't work as is: only value is saved on stack;special cell
deep as in interlisp; save name-value; like a-list; compiler can be smart
stupid interpreter; stupid interlisp
compiling other parts of lisp: fexprs, macros. functional args. progs.
debuggers and editors
tracing: hide defn; replace with tracer; similar monitor on assignment
what should LISP have:
documentation: damn tired of people reinventing McCarthy's ideas
user-defined ds.
structuring s.t. not only an impl. in the lang but THE impl in the lang
sequences, structures, pointers
abstract control
better closures
form vars
LECT 5: reflections...In a pig's eye!!!
to: pdp-11 hackers: look at LISP as if you were bootstrapping to the 11
don't look at all the details.
reflections
remarks on short course and teaching
students, particularly grad students, should be stirred up
teachers are not dispensers of wisdom in this field
structured programming
program is not, programming should be.
structured apl (NO WAY), cf siftup (BOO!)
what is interesting
syntax isn't
compilation isn't
compilers, syntax analysis and pl1 have done much to retard c.s.
semantics is
structure of language system
problems in programming are not in efficiency but in correctness!!!!!!!
people gasp at the reputation of lisp-like inefficiency, but are
conditioned to long program development and debugging time, buggy
programs and crap. even if machines were made ten times faster the
debugging problem would not go away (but LISP interpretation
would be acceptable)
required: not necessarily fol proofs but proofs with "conviction".
COMPARE AI AND THEORY: mismatch between titles and content of papers
ai: program which plays chess => 4x4 board with two pieces
theory: program verification system for pascal-lisp-pl1
all languages are bad: some are worse than others
what is a lisp-like lang; why is it better/different.
why is LISP still around? very strange reasons given...
apg: 1)formal; 2) guess what i'm thinking
what is feasible, desirable, useful...
not mind reader
not knowledge of programming area
programs which try to be smart lose very big.
know about programming, types, ...
competent assistant is reasonable
semantics
operational
axiomatic
denotational
language: syntax pragmatics and semantics
benefits of denotational modeling
precision => provability
don't overspecify cf. order in cond
correctness of compilers and evaluation
compiler correctness is different from prog correctness
equivalence-Boyer Moore
manipulation: darlington
lcf and Scott
cf. axioms + correctness, completeness proofs
semantic equations, existence of model
mal
correctness of compilers and interpreters for lisp subsets
axioms for formal manipulation of lisp progs.
mg
scott for lisp
semantic equations
model
rules of inference
tennent
good popularizer, QUEST
milne
attempts to analyze "realistic" implementations; unreadable
wadsworth
analysis of λ-cal
implications
el1 and s-lisp
planner
calling by pattern
data bases
conniver
add -- if-added
remove -- if-removed
fetch -- if-needed try-next and possibilities lists
make prog look like data
control and access environs
adieu and au-revoir
contexts for d.b.'s
actors and control???
list of papers to read
humble programmer
three computer cultures
authors, wegner, scott, reynolds,
organized bull
comments on teaching short course
high level vs. details --picked h.l. because that's interesting
lack of time for questions -- very bad
cf. (F A1 A2 ... AN)
1) ignore
2) "not the way it's done anyway
missed parts
efficiency, applications, a-i languages
what is wrong with lisp
1) NOT syntax
2) closures not done well
3) d.s. weak
a) for user
b) for LISP i.e. implementation of THE interpreter not AN int
what is lisp-like lang
1) interpreter based
2) which works on NATURAL rep of prog.
clearly other languages can effect the map-- all are disgustingly the same.
but lisp did it intentionally
why it's good for you
software problem: get programs which work. demonstrably correct
problem is conceptual not technological
l.s system
do NOT want "smart" system. pain in ass
do NOT want "mind-reader" system. pain in ass
invariably "stupid".
structuring of structured programming occurs in construction
cf. siftup
what is structured programming. (sic)ing.
easier to say what it is NOT. not block structure, not APL
system components language; command language to manipulate progs
editor debugger, verifier, interpreter, compiler
how to make lisp better
user defined data structures
what is d.s.? dunno. again what it is not c.f. actors and DCL
more than stupid program...conceptually different
c.f. everything is atom and driving to LA.
anecdote on CDR and DBA
like type...abstract syntax
abstract control....cf. set var
operations must map somehow to ppl.
system must be able to manipulate programs to tell if correct.
semantics
axiomatic
axioms and rules of inference, reduction, simplification rules
hoare
mg for LISP
λ-calc
order of evaluation built into LISP but not into λ-calc
normal forms in λ-calc
a) not everything has a n.f., but if it does, outside-in (cbn) gets it
b) things with distinct n.f.'s are distinct: ∃ args s.t. applying gets
distinct objs
c) not true for non-n.f.'s: ∃ distinct exprs which under natural interpretation
are the same...how do you prove such?
operational
in terms of machine
TM ...boo
compiler...boo
vdl...sigh
lisp ....
denotational
language = syntax + pragmatics + semantics
cf. λ-calc
proc is just math function
mg.
denotational model for lisp
reduction rules for lisp
red(<e,a>) = A <=> e(a) = A; i.e. e is an expression; it denotes a function which when
applied to an arg (env) gives a value (denotation)
also eval(e*,a*) = e(a)   (* = rep as sexpr)